Learning a hyperplane regressor by minimizing an exact bound on the VC dimension
Authors
Abstract
The capacity of a learning machine is measured by its Vapnik-Chervonenkis (VC) dimension, and learning machines with a low VC dimension generalize better. It is well known that the VC dimension of SVMs can be very large or unbounded, even though they generally yield state-of-the-art learning performance. In this paper, we show how to learn a hyperplane regressor by minimizing an exact, or Θ, bound on its VC dimension. The proposed approach, termed the Minimal Complexity Machine (MCM) Regressor, involves solving a simple linear programming problem. Experimental results show that, on a number of benchmark datasets, the proposed approach yields regressors with error rates much lower than those obtained with conventional SVM regressors, while often using fewer support vectors. On some benchmark datasets, the number of support vectors is less than one tenth of the number used by SVMs, indicating that the MCM does indeed learn simpler representations.
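The abstract notes that training reduces to a simple linear program. As a rough illustration only (not the paper's actual formulation, whose objective is tied to the exact VC-dimension bound), the sketch below fits a hyperplane regressor by LP: a variable h caps the worst-case training residual, and per-sample slacks q_i absorb deviations beyond an ε-insensitive tube. The function name fit_lp_regressor and the parameters eps and C are our own illustrative choices.

```python
import numpy as np
from scipy.optimize import linprog

def fit_lp_regressor(X, y, eps=0.1, C=1.0):
    """Fit f(x) = w @ x + b by linear programming.

    Variables: w (d), b, h, q (m).  Minimize h + (C/m) * sum(q) subject to
      |y_i - (w @ x_i + b)| <= h            (h bounds the worst residual)
      |y_i - (w @ x_i + b)| <= eps + q_i    (q_i is slack beyond the eps-tube)
    Hypothetical sketch only: the MCM paper's LP ties h to its exact
    VC-dimension bound, which is not reproduced here.
    """
    m, d = X.shape
    n = d + 2 + m                      # variable order: [w, b, h, q_1..q_m]
    c = np.zeros(n)
    c[d + 1] = 1.0                     # weight on h
    c[d + 2:] = C / m                  # weight on each slack q_i

    A_ub, b_ub = [], []
    for i in range(m):
        for sign in (+1.0, -1.0):
            # sign * (y_i - w @ x_i - b) <= h
            row = np.zeros(n)
            row[:d] = -sign * X[i]
            row[d] = -sign
            row[d + 1] = -1.0
            A_ub.append(row)
            b_ub.append(-sign * y[i])
            # sign * (y_i - w @ x_i - b) <= eps + q_i
            row = np.zeros(n)
            row[:d] = -sign * X[i]
            row[d] = -sign
            row[d + 2 + i] = -1.0
            A_ub.append(row)
            b_ub.append(eps - sign * y[i])

    bounds = [(None, None)] * (d + 1) + [(0, None)] * (1 + m)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[:d], res.x[d]

# Example usage on synthetic data:
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 3))
y = X @ np.array([1.5, -2.0, 0.5]) + 0.3 + 0.05 * rng.normal(size=60)
w, b = fit_lp_regressor(X, y, eps=0.1, C=10.0)
```

Because every constraint is linear in (w, b, h, q), an off-the-shelf LP solver suffices; no quadratic programming, as in standard SVM training, is required.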
Similar articles
Learning a hyperplane classifier by minimizing an exact bound on the VC dimension
The VC dimension measures the complexity of a learning machine, and a low VC dimension leads to good generalization. While SVMs produce state-of-the-art learning performance, it is well known that the VC dimension of an SVM can be unbounded; despite good results in practice, there is no guarantee of good generalization. In this paper, we show how to learn a hyperplane classifier by minimizing an...
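For reference, the hard-margin form of the MCM classifier described in this companion paper is commonly written as the linear program below; this is our transcription and omits the soft-margin slack terms used in practice:

\[
\min_{u,\,v,\,h}\; h
\qquad \text{subject to} \qquad
h \;\ge\; y_i \left( u^{\top} x_i + v \right) \;\ge\; 1,
\quad i = 1, \dots, M,
\]

where minimizing h minimizes the ratio of the largest to the smallest distance of a training point from the hyperplane, the quantity that drives the exact (Θ) bound on the VC dimension.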
Learning a Fuzzy Hyperplane Fat Margin Classifier with Minimum VC dimension
The Vapnik-Chervonenkis (VC) dimension measures the complexity of a learning machine, and a low VC dimension leads to good generalization. The recently proposed Minimal Complexity Machine (MCM) learns a hyperplane classifier by minimizing an exact bound on the VC dimension. This paper extends the MCM classifier to the fuzzy domain. The use of a fuzzy membership is known to reduce the effect of ...
A Neurodynamical System for finding a Minimal VC Dimension Classifier
The recently proposed Minimal Complexity Machine (MCM) finds a hyperplane classifier by minimizing an exact bound on the Vapnik-Chervonenkis (VC) dimension. The VC dimension measures the capacity of a learning machine, and a smaller VC dimension leads to improved generalization. On many benchmark datasets, the MCM generalizes better than SVMs and uses far fewer support vectors than the number u...
Feature Selection through Minimization of the VC dimension
Feature selection involves identifying the most relevant subset of input features, with a view to improving the generalization of predictive models by reducing overfitting. Directly searching for the most relevant combination of attributes is NP-hard. Variable selection is of critical importance in many applications, such as micro-array data analysis, where selecting a small number of discriminative...
Learning with Spectral Kernels and Heavy-Tailed Data
Heavy-tailed data, e.g., graphs in which the degree sequence decays according to a power law, are ubiquitous in applications. In many of those applications, spectral kernels, e.g., Laplacian Eigenmaps and Diffusion Maps, are commonly-used analytic tools. We establish learnability results applicable in both settings. Our first result is an exact learning bound for learning a classification hyper...
Journal: CoRR
Volume: abs/1410.4573
Issue: -
Pages: -
Publication date: 2014